Linear Algebra¶
- Linear algebra is the branch of mathematics concerned with the study of vectors, vector spaces, linear transformations, and systems of linear equations.
- It explores the properties of matrices, vectors, and linear functions, applying principles of algebra to understand geometric concepts like lines and planes.
1.) Key Concepts¶
Scalar: A single number (e.g., temperature = 37°C).
Vector: A 1D array of numbers (e.g., [1, 2, 3]) → represents direction & magnitude.
Matrix: A 2D array of numbers (rows × columns).
Tensor: Higher-dimensional generalization of matrices (e.g., images in deep learning).
Scalar = 5, Vector = [1, 2, 3], Matrix = [[1, 2], [3, 4]]
import numpy as np
s = 5 # scalar
v = np.array([1, 2, 3]) # vector
M = np.array([[1, 2], [3, 4]]) # matrix
T = np.random.rand(3, 3, 3) # tensor
print("Scalar:", s)
print("Vector:", v)
print("Matrix:\n", M)
print("Tensor shape:", T.shape)
Scalar: 5
Vector: [1 2 3]
Matrix:
 [[1 2]
 [3 4]]
Tensor shape: (3, 3, 3)
2.) Vector Operations¶
Vectors represent quantities with both direction and magnitude (like force or velocity). Operations on vectors let us compare or combine them.
Addition:¶
Combine two vectors (head-to-tail rule).
Dot Product:¶
The dot product measures how aligned two vectors are: a·b = |a||b|·cos θ, where θ is the angle between them. If the dot product is 0, the vectors are orthogonal (perpendicular, hence linearly independent).
a = np.array([1, 2])
b = np.array([3, 4])
print("Addition:", a + b)
print("Dot Product:", np.dot(a, b))
Addition: [4 6]
Dot Product: 11
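To make the cosine link concrete, here is a minimal sketch (the variable names are our own) that computes cos θ = a·b / (|a||b|) and checks an orthogonal pair:
cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
print("Cosine of angle:", cos_theta)
# Orthogonal vectors: dot product is 0
print("Dot of [1, 0] and [0, 1]:", np.dot([1, 0], [0, 1]))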
3.) Matrix Operations¶
Matrices are like containers of vectors: each row (or column) can be read as a vector.
Addition/Subtraction = element-wise.
Multiplication = composition of transformations (like rotating + scaling vectors). In data science, multiplication is used in linear regression (Xβ), neural networks (weights × inputs), etc.
A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])
print("Matrix Multiplication:\n", np.dot(A, B))
Matrix Multiplication:
 [[19 22]
 [43 50]]
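As a sketch of the element-wise operations and the Xβ pattern mentioned above (X and beta below are made-up toy values, not from the original):
print("Addition:\n", A + B)      # element-wise
print("Subtraction:\n", A - B)   # element-wise
# Toy linear-regression prediction y_hat = X @ beta (illustrative values only)
X = np.array([[1.0, 2.0], [1.0, 3.0], [1.0, 4.0]])  # design matrix with an intercept column
beta = np.array([0.5, 2.0])                          # coefficient vector
print("X @ beta:", X @ beta)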
4.) Determinant & Inverse¶
Determinant measures how much a matrix scales space.
det = 0 → matrix squashes space → not invertible.
Inverse is like division for matrices: if A·A⁻¹ = I, then A⁻¹ undoes the transformation of A.
A = np.array([[4, 7], [2, 6]])
print("Determinant:", np.linalg.det(A))
print("Inverse:\n", np.linalg.inv(A))
Determinant: 10.000000000000002
Inverse:
 [[ 0.6 -0.7]
 [-0.2  0.4]]
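A quick sketch of the det = 0 case mentioned above (the matrix here is our own example): the second row is a multiple of the first, so the matrix squashes the plane onto a line and NumPy refuses to invert it.
S = np.array([[1, 2], [2, 4]])           # second row = 2 × first row
print("Determinant:", np.linalg.det(S))  # ≈ 0 → singular
try:
    np.linalg.inv(S)
except np.linalg.LinAlgError as err:
    print("Not invertible:", err)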
5.) Eigenvalues & Eigenvectors¶
- Eigenvectors are the directions a matrix does not rotate: when multiplied by the matrix, they are only scaled (stretched, shrunk, or flipped).
- Eigenvalues tell us by how much those directions are scaled.
- Used in PCA (Principal Component Analysis) to find directions of maximum variance.
A = np.array([[2, 0], [0, 3]])
eigvals, eigvecs = np.linalg.eig(A)
print("Eigenvalues:", eigvals)
print("Eigenvectors:\n", eigvecs)
Eigenvalues: [2. 3.]
Eigenvectors:
 [[1. 0.]
 [0. 1.]]
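To tie this to PCA, here is a minimal sketch on toy data (the data and variable names are our own): the eigenvectors of the covariance matrix give the directions of maximum variance.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) @ np.array([[3, 0], [0, 1]])  # data stretched along the x-axis
cov = np.cov(X, rowvar=False)        # 2×2 covariance matrix
vals, vecs = np.linalg.eig(cov)
order = np.argsort(vals)[::-1]       # largest variance first
print("Principal directions:\n", vecs[:, order])
print("Explained variance:", vals[order])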
6.) Singular Value Decomposition (SVD)¶
SVD factorizes any matrix as A = U·S·Vᵀ (a rotation, a scaling, and another rotation). It generalizes the eigendecomposition to non-square matrices and is used in PCA and low-rank compression.
A = np.array([[3, 1], [1, 3]])
U, S, Vt = np.linalg.svd(A)
print("U:\n", U)
print("Singular values:", S)
print("V^T:\n", Vt)
U:
 [[-0.70710678 -0.70710678]
 [-0.70710678  0.70710678]]
Singular values: [4. 2.]
V^T:
 [[-0.70710678 -0.70710678]
 [-0.70710678  0.70710678]]
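As a quick check (a sketch, not part of the original output), multiplying the factors back together recovers A, and keeping only the largest singular value gives the best rank-1 approximation:
print("Reconstructed A:\n", U @ np.diag(S) @ Vt)
# Rank-1 approximation: keep only the largest singular value
A1 = S[0] * np.outer(U[:, 0], Vt[0, :])
print("Rank-1 approximation:\n", A1)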